Hierarchically coupled causal models#
This notebook showcases how to generate hierarchically coupled causal models.
# Autoreload extension
%load_ext autoreload
%autoreload 2
%matplotlib inline
import torch
import xarray as xr
import matplotlib.pyplot as plt
import matplotlib as mpl
from causaldynamics.utils import set_rng_seed
from causaldynamics.initialization import initialize_weights, initialize_biases, initialize_x, initialize_system_and_driver
from causaldynamics.scm import create_scm_graph, get_root_nodes_mask, GrowingNetworkWithRedirection
from causaldynamics.mlp import propagate_mlp
from causaldynamics.systems import solve_system, solve_random_systems
from causaldynamics.plot import plot_trajectories, animate_3d_trajectories, plot_scm
# Set a fixed seed for reproducibility
set_rng_seed(42)
# Increase the animation limit to 50MB
mpl.rcParams['animation.embed_limit'] = 50 * 1024**2
Generate data the convenient way#
This is the recommended, convenient way to generate data. If you need more flexibility, see the step-by-step guide below.
from causaldynamics.creator import create_scm, simulate_system, create_plots
num_nodes = 2
node_dim = 3
num_timesteps = 200
confounders = False
system_name='Lorenz'
A, W, b, root_nodes, magnitudes = create_scm(num_nodes,
                                             node_dim,
                                             confounders=confounders)
data = simulate_system(A, W, b,
                       num_timesteps=num_timesteps,
                       num_nodes=num_nodes,
                       system_name=system_name)
create_plots(
data,
A,
root_nodes=root_nodes,
out_dir='.',
show_plot=True,
save_plot=False,
return_html_anim=True,
create_animation=False,
)
INFO - Creating SCM with 2 nodes and 3 dimensions each...
INFO - Simulating Lorenz system for 200 timesteps...
INFO - Generating visualizations
Generate data step by step#
Let’s explore step-by-step what happens internally in the create_scm and simulate_system functions.
Let’s create a minimal example with a causal graph that has two nodes, 0 and 1, and a single edge 0<-1. The signal we send through that graph is a three-dimensional Lorenz attractor trajectory computed for 1000 time steps.
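To build intuition for what solve_system produces, here is a minimal, library-independent sketch of a Lorenz integrator using explicit Euler steps with the standard parameters (sigma=10, rho=28, beta=8/3). The function names and the fixed step size dt=0.01 are assumptions for illustration; the library may use a different integration scheme.

```python
import torch

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit Euler step of the Lorenz system."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * torch.stack([dx, dy, dz])

def integrate_lorenz(num_steps, init=(1.0, 1.0, 1.0)):
    """Integrate the Lorenz system, returning a (num_steps, 3) trajectory."""
    state = torch.tensor(init)
    trajectory = [state]
    for _ in range(num_steps - 1):
        state = lorenz_step(state)
        trajectory.append(state)
    return torch.stack(trajectory)

traj = integrate_lorenz(1000)
print(traj.shape)  # torch.Size([1000, 3])
```

The resulting (time, dim) array plays the role of the root-node signal that solve_system supplies below.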
# Define parameters for a small test example
N_nodes = 2
N_timesteps = 1000
N_dimensions = 3
# Generate Lorenz attractor trajectory
d_lorenz = solve_system(N_timesteps, N_nodes, "Lorenz")
# Sample the simplest possible adjacency matrix: 0<--1
A = GrowingNetworkWithRedirection(N_nodes).generate()
A
tensor([[0., 0.],
[1., 0.]])
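Under the convention assumed here that A[i, j] = 1 encodes a directed edge i -> j, a root node is one whose column contains no incoming edge. A minimal sketch of such a mask (the library's get_root_nodes_mask may differ in details):

```python
import torch

def root_nodes_mask(A):
    """Mask of root nodes, assuming A[i, j] == 1 means an edge i -> j."""
    return A.sum(dim=0) == 0  # no incoming edges

A = torch.tensor([[0., 0.],
                  [1., 0.]])
print(root_nodes_mask(A))  # tensor([False,  True]): node 1 is the root
```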
# Sample random weights for the MLPs
W = initialize_weights(N_nodes, N_dimensions)
W
tensor([[[ 1.8306, 0.1129, 0.4252],
[-1.3446, 1.6114, 0.5914],
[-1.1415, -0.5930, -0.0058]],
[[ 1.4170, -2.0788, 0.6370],
[ 1.3824, -0.9156, 0.0159],
[ 0.7230, -0.3092, 0.8614]]])
# Sample random biases for the MLPs
b = initialize_biases(N_nodes, N_dimensions)
b
tensor([[ 1.2865, -1.7983, 0.3016],
[ 2.6508, -1.3340, 0.6698]])
Let’s plot the minimal test graph 0<-1 and propagate the Lorenz attractor through the causal graph.
First, initialize the root node 1 with the time-series data from the Lorenz attractor. Then, propagate this signal through the edge 1->0. The signal is fed into the sampled multilayer perceptron (MLP), which has no activation function; its output is the state of node 0.
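For this two-node graph, the propagation reduces to a single affine map: at every time step, the child's state is its parent's state pushed through the sampled weights and bias. A minimal sketch of that step (the library's propagate_mlp handles arbitrary graphs and may order operands differently; the names below are illustrative):

```python
import torch

torch.manual_seed(0)
num_timesteps, node_dim = 50, 3

parent = torch.randn(num_timesteps, node_dim)  # stand-in for the Lorenz signal at node 1
W0 = torch.randn(node_dim, node_dim)           # sampled weights for node 0
b0 = torch.randn(node_dim)                     # sampled bias for node 0

# Linear layer with no activation: child state at each time step.
child = parent @ W0.T + b0
print(child.shape)  # torch.Size([50, 3])
```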
# Initialize the root nodes with the Lorenz attractor
init = initialize_x(d_lorenz, A)
# Propagate Lorenz trajectory through the SCM
x = propagate_mlp(A, W, b, init=init)
# Visualize the results using xarray
da = xr.DataArray(x, dims=['time', 'node', "dim"])
root_nodes = get_root_nodes_mask(A)
plot_scm(G=create_scm_graph(A), root_nodes=root_nodes)
plt.show()
plot_trajectories(da, root_nodes=root_nodes, sharey=False)
plt.show()
# Create an animation of the simple coupled Lorenz attractor.
da_sel = da.isel(time=slice(0, 500))
anim = animate_3d_trajectories(da_sel, frame_skip=2, rotate=True, show_history=True, plot_type='subplots', root_nodes=root_nodes)
display(anim)